Ep. #14, Quality Culture with Rosie Sherry of Ministry of Testing
In episode 14 of How It’s Tested, Eden Full Goh speaks with Rosie Sherry, founder of the Ministry of Testing, about her journey building one of the largest communities in software testing. Rosie shares insights on how testing roles have evolved, the impact of AI, and the importance of fostering a company-wide culture of quality. Tune in for a deep dive into the future of testing and quality assurance.
Rosie Sherry is the founder of the Ministry of Testing, a global community dedicated to advancing software testing through education, collaboration, and events. With over two decades of experience in QA, Rosie has become a leading advocate for building quality cultures in tech and empowering testers worldwide.
Transcript
Eden Full Goh: Hi Rosie. Thank you so much for joining us on the How It's Tested podcast today.
Rosie Sherry: Thanks Eden for having me.
Eden: I know you have a ton of experience building and founding testing communities, curating them, a lot of that is of course based on your own personal experiences in your career early on in the software industry.
And so I would really love to hear from you, in your own words, about your journey up until today and what led you to ultimately start Ministry of Testing, which is one of the longest-standing communities in the industry. I've seen its resources and events come up again and again.
And so it's really cool to have a chance to talk to you about it and hear kind of like how you started Ministry of Testing. And also I think, the conversation today about where our industry is headed is going to be really exciting.
Rosie: Yeah, thank you. Yeah. Where should I start? I mean I've been in tech testing. My first job in tech was a testing job, very junior, no experience in tech in any kind of way. And that was around the year 2000.
I was working in a bank actually, so I didn't actually have a role as a tester initially, but I had an opportunity to UAT test an app that they were building and I kind of used that on my CV to get my first proper job in testing a few months later.
I used that as a way to say I had a bit of experience in testing, right. But yeah, I mean, the year 2000 feels like a whole different era than today. A whole different world.
Testing back then was really different from what it is today. There was a lot of manual stuff, not a lot of information out there for testers. A lot of corporate testing, I guess, and a lot of process. Not necessarily very efficient testing in my opinion, but you know, it is what it is.
And yeah, I guess I got my first job in testing at a local web media company, and basically, as soon as you get that first job in something, it becomes easier to move on to the next thing or find the next opportunity.
So within a year, I already had another job in testing, doing something much more technical. That was, to be honest, way over my head. I didn't really understand the whole system under test, it was quite intense.
But yeah, it was around that time of the dot-com boom, like 2001, and I ended up getting laid off. After that, I did freelance testing, I did contracts for local companies, worked on some fun projects, and I just kept doing that for a few years.
I was also hanging out in the local tech community in Brighton, I live in Brighton in the UK. So mixing at the local meetups kind of exposed me to the world of, or the concept of, community.
And I ran some local meetups and I just got hooked on the idea of community while also continuing my career as a tester. And then, yeah, the more I did community, the more I was like, how can I make community my thing? How can I build community?
And as part of that I decided, because I knew about testing and there wasn't really a lot out there for testers, I felt a bit jealous of like what web designers had at the time. There was a lot of great content, a lot of people talking about things to advance the web industry, and I just felt like there was never anything really around for testers.
So I started Ministry of Testing at that point and that was 2007. And when I started it, I called it the Software Testing Club. So it started as an online web hosted forum.
And really, I just did it because I wanted to, I wanted an excuse to build community basically. I really didn't have any expectations that it would go anywhere, and I didn't push it hard. I just kept trying to have fun and have interesting conversations, basically trying to create a space that I felt was valuable for myself and for the community.
And I did that for about three years before it got to be a lot to manage. By that time I had two kids as well, so I was managing that in my life, and after three years I kind of told myself that I had to turn it into a business or I would stop doing it, because it was just sucking up so much of my time.
That was when I decided to start a conference as a way to kind of try to make a living from the community. As they say, the rest is history. Yeah, we kept growing as a conference and as a place that kind of provided education and then we created an online platform and then people started asking us for conferences elsewhere.
So we did conferences elsewhere, and in 2019 we held nine conferences across the globe, which was a bit nuts to be honest. I don't think we'd ever do that again. But then, yeah, obviously the pandemic hit and things have changed since then, we've gone back to one a year and kind of changed our strategy around that. But yeah, it's definitely been a journey.
Eden: I mean, it's incredible to even think about, you were talking about the dot-com boom. And ever since you started Ministry of Testing, there have been more browsers and desktop configurations, the introduction of mobile devices, how the iOS and Android ecosystems have evolved.
And I think there's just been so much evolution and change in the industry, and the role of testing has also evolved as Selenium and Appium, Detox, Maestro, all of these kinds of frameworks have started appearing.
And so I guess I'm curious, from when you started Ministry of Testing in 2007 to now, about the programming and the content. I'm assuming there used to be more of a focus on manual testing when that was more prevalent, but the skillset, the technical skills that testers are looking to develop, has evolved.
I'd be curious to kind of hear your observations and reflections on just like what's kind of been the shift and yeah, the kind of content or the kind of programming, the kind of community, like how has that changed over the last couple decades?
Rosie: Yeah, I mean it's changed so much, and I think that's a sign of a healthy industry, I guess, that it has changed so much, right? So much of what exists today wasn't around 15 years ago. I think the industry definitely has its specialisms, whether it's automation or specific tools or specific areas of knowledge.
I think testers are also generalists at the same time. So I guess there's a push and pull between being a specialist and a generalist, and the fact that to do good testing quite often you need to know a lot about everything: you need to speak with your customers, understand the market, speak with various people on the team.
There's that aspect of it. There's probably no one else in the team doing that, right? Developers have their heads down doing dev work and designers are so focused on design, but testers are often spread across teams trying to help improve so many potential areas.
There's definitely a shift towards much more technical skills. But at the same time I think technical skills are often overrated, and sometimes we just need good communication skills or a good understanding of systems and processes.
And I think a lot of software these days is not necessarily failing because of technical problems; it's more people problems or process problems. And perhaps that's where a lot of testers struggle.
Like, in the recruitment process, technical skills are often demanded upfront, but in reality they're not actually needed nearly as much. And so there's this kind of mismatch of expectations.
I know a lot of automation experts who actually don't do a lot of automation day to day. I think the same happens for developers, right? You get hired as a developer and a lot of developers often get stuck doing different work naturally, not coding.
For me it's just great to see so much change, and I like to look back in five-year chunks and think about where the community or the industry was five years ago or 10 years ago. It's when you look at it from that perspective that you really start to notice how things have changed.
Eden: Yeah, it's interesting what you were bringing up earlier about how oftentimes like, when something is not working on a product or at a company, it's not because like the testing process was flawed or something, it is usually like a breakdown in communication or some other skill gap or business process gap.
And I've definitely seen that pattern across some of the engineering, QA, product, and marketing teams that we work with at Mobot: oftentimes the test case didn't even capture the product specifications correctly, because the product manager never specified what the expected behavior of the product was supposed to be.
So how is a regression test case supposed to be defined if there wasn't even a UAT best practice applied before something was brought on and implemented?
And so I'm curious about the kinds of conversations that you've had with the community, 'cause I'm sure this comes up a lot: testing is often seen as a process that happens after everything else.
You know, quality assurance is sometimes not prioritized as much as design or engineering, but all of these processes need to work together.
But I'm curious what observations you've had over the last few years about bringing quality assurance and testing closer to the other functions, to resolve some of the issues and patterns we've seen appear: when testing and quality are an afterthought rather than part of the original development cycle, things just don't go as well when shipping releases.
Rosie: Yeah, I guess two trends come to mind. One is QA managers, there are definitely a lot fewer of them these days, and there aren't as many test teams sitting on their own in their silo.
Testers often end up as part of the overall tech team, I guess, or in specific areas where they're more mixed in with developers and designers, working more directly in there. And as a result there are obviously fewer QA managers.
But maybe there are more engineering managers as a result, and I've seen testers who end up going into engineering management as a career path as well. So that's quite interesting, and it's really nice to see some of the members in the community, who we've watched grow up or advance their careers, becoming engineering leaders.
For me at least, it's wonderful to see. It's almost like there's this respect for our background as professional testers, that we do know what we're doing, that we do understand how our products work and how to manage them.
And the other one that springs to mind is the quality coach, which has been talked about a lot more recently. Vernon Richards is someone who's spoken about it, and Emna Ayadi also recently did a talk at TestBash about being a quality coach.
That's more like people who are often testers but are more focused on quality overall. They come in and they support the team to improve the quality overall in a coaching way and they're not necessarily going to be there forever.
It might be temporary, but it's more like educational and support to help the team look for risks or improve their processes and things like that.
Eden: I guess just to make sure I understand kind of the way that you're defining the difference between testing versus quality. Quality is more kind of the long-term overarching structure of how that process fits into the product development of a piece of software.
And testing is kind of more like the one-off task that's done at a release, to kind of just test to get a release out the door. Am I understanding that correctly or how would you kind of define quality versus testing?
Rosie: Yeah, it's a bit of a topic to be honest. It's like what do we call our people in Ministry of Testing? Because these days some people call themselves testers or automation experts, but some people call themselves like quality assurance professionals.
Others are now calling themselves quality engineering professionals or testers. And I'm not in a position to define it, to be honest, but--
Quality is culture, right? That's kind of how I see it. Testing can't force a quality culture. It can support it, or maybe testing is something that supports a quality culture, right?
It's not necessarily testing, but yeah, I just think there's a lot of overlap, and testers are probably somewhat generalists as well. So yes, they're testing, but they're also supporting the quality culture of the company.
They're also trying to think of the users, they're trying to look at the software from many perspectives that other people wouldn't. So there's definitely a Venn diagram there, I think, between testing and quality, and what's that bit in the middle? I'm not sure off the top of my head.
Eden: I really love what you just said about quality being a cultural thing, and I think that applies across the whole organization. It isn't something that only testers or quality engineers or quality analysts care about.
The product manager needs to care about quality and buy into the culture. The customer support team that's fielding all of the production issue complaints from users and funneling them back to engineering and to the quality team, they need to be bought in as stakeholders of the quality culture at the company. So do the business units that are defining deadlines for what features are being A/B tested, and all of that.
Everybody needs to be bought in and needs to understand and prioritize that quality is important in an organization. And I think the way you put it was really helpful. I'm curious how your community has recommended integrating quality back into the larger organization, beyond just a community of testers.
How are folks encouraged to talk to product managers or marketing folks across the company to make sure that testing and quality are kind of embedded across the company throughout?
Rosie: Yeah, I mean, it's a difficult one. I think in some senses, testers being embedded as part of the team naturally brings that up, right? So they're not necessarily sitting on their own in the corner doing their work.
It's more collaborative and I think just the world of tech has gone that way with agile, with DevOps type stuff. Yeah, I just think it's really hard for testers to do good work on their own.
And I just hope, in general, that the days of bugs being released into production and testers getting the blame for it are well into the past. I feel like they are. And I guess what also springs to mind is having a culture of care across organizations.
I talk a lot about care in terms of the community work I do. If we all care as a team, then we'll naturally just ship better products. We'll naturally talk more, we'll naturally point things out and try to fix them.
So for me, I just love the aspect of people who care about the work that they're doing, and if anything, I think the testing community has really come on leaps and bounds from that perspective.
If we think back to 2007 when I was just starting up Ministry of Testing, there were just a handful of people blogging, and generally white men. And now there are so many blogs, so many YouTube channels, people contributing everywhere, it's so hard to keep up, right?
And for me that's just like a sign that people care about testing. That's so good to see.
Eden: Yeah, I've definitely seen kind of what you're saying about this like pod structure, where testers and quality engineers are integrated more into the broader team.
Yeah, I've seen, there's a transactions team, there's an app activations team, and each of those teams has a couple of engineers, a product manager, a designer, and then a quality assurance tester or quality assurance engineer, and they're all embedded together.
And then also when issues come through, there's definitely clearer lines of ownership that like, hey, this whole team owns this feature. And so, it isn't the tester's fault, right? Everybody needed to kind of be defining and shipping it together.
So yeah, in the last five years I've seen that same pattern across some of the teams that we get to work with. It's becoming more and more prevalent, and I think it's exciting to see where the industry will continue to shift as more teams adopt this.
Rosie: Absolutely.
Eden: Going back to one of your other points from earlier, you were saying that when QA folks get hired, the job description says, oh, you need this and this automation experience, and it's almost deceptively focused on technical skills.
Obviously it is incredibly important that many testers are technical, especially now as our industry has evolved, but they join the team and then end up having to deal with all of these other business pressures and priorities, and they're not doing as much technical automation as was originally depicted in the job description.
I'm curious how you think AI has changed this challenge, or whether it's going to continue to change it. Have you already seen that with some of the folks in your community?
Rosie: Yeah, I mean at our conference that we had just last month, TestBash, we had a whole day dedicated to AI and testing and we've been exploring AI this year with a couple of online events as well.
I think AI is interesting. I guess the way I see it at the moment is that it's definitely early days, and it still feels like people don't trust the outputs of AI.
There's a lot of hallucination-type stuff and just crazy stuff that it outputs, but I think it's interesting. It's almost like AI in testing will become a specialism.
We've had such a focus on test automation, but maybe that will balance itself out with AI, or maybe AI will end up doing a lot of the automation work instead of test automation experts. I think that's interesting.
For me, with AI, a lot of the tool vendors are just adding AI onto their existing tools. I saw a tool a few weeks back that caught my eye, it's not a testing tool, but basically, when problems occur, it tries to tell you where they have recurred and then sends the information off to relevant people on the team.
So that kind of stood out to me because it's not trying to be like another test automation tool, it's just, it's more like trying to solve the problem I guess.
Like if there's a code problem, developers can often spend hours looking for the problem in the code and it ends up being a simple typo. If AI can spot that for you in minutes, that to me feels like using AI in a different kind of way, rather than just slapping it onto how we test or how we have tested in the past.
So I guess from that perspective, I'd love to see AI kind of be used to help us test, but not necessarily in the way that we've always tested.
Eden: Yeah, in terms of the quality process and how AI can be applied, I feel like I've seen a lot of AI tools start popping up for defining and generating new test cases, or instructions for automation tools, or maybe building the initial scaffolding of a test.
I've seen a lot of AI tools do that. I've seen very few AI tools actually follow through with the execution of the test, because the execution is the hard part.
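To make that scaffolding-versus-execution gap concrete, a generated test often looks something like the sketch below, a minimal Playwright example where the URL, selectors, and checkout flow are hypothetical placeholders: the structure is easy to produce, but the execution details still have to be filled in and verified by a person.

```typescript
// A minimal sketch of the kind of scaffolding an AI tool typically produces.
// The structure and step names are easy to generate; the selectors, test
// data, and assertions (the execution details) still need a human.
// The URL, selectors, and "checkout" flow here are hypothetical.
import { test, expect } from '@playwright/test';

test('user can complete checkout', async ({ page }) => {
  await page.goto('https://example.com/store'); // hypothetical URL

  // TODO: generated step -- "add an item to the cart"
  // TODO: generated step -- "open the cart and start checkout"
  // TODO: generated step -- "enter payment details"

  // The final assertion is where generated tests usually stay vague:
  await expect(page.getByText('Order confirmed')).toBeVisible();
});
```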
And I think this gets into what you mentioned about hallucinations. We've also seen that at Mobot with some of the early prototypes and tools that we were implementing ourselves: sometimes the AI will tell you, "Oh, go and tap this button," and that button is not actually on the screen.
It just thinks it should be on the screen. Which is almost funny, because it reflects how humans think: you expect it to be a certain way, but that doesn't mean that's actually what you're encountering in reality.
So can the test continue? Can you keep going, or is this where the test fails because you've not met the user acceptance criteria? I think the execution and decision-making piece of the testing process is where the opportunity around AI is.
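One way teams guard against the hallucinated-button problem described here is to assert that an element is actually present before acting on it, so the run fails fast instead of drifting on. A minimal Playwright sketch, with a hypothetical URL and button name:

```typescript
// A minimal Playwright sketch of the guard described above: verify the
// element really is on screen before acting, and fail the test instead of
// carrying on as if it were there. URL and button name are hypothetical.
import { test, expect } from '@playwright/test';

test('the Continue button must exist before we tap it', async ({ page }) => {
  await page.goto('https://example.com/signup'); // hypothetical URL
  const continueButton = page.getByRole('button', { name: 'Continue' });

  // If the button an AI-generated step "expects" is not rendered, this
  // assertion fails within the timeout rather than letting the run drift on.
  await expect(continueButton).toBeVisible({ timeout: 5_000 });
  await continueButton.click();
});
```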
And that's also an area that we're working on. But what's also exciting, and I think a little more defined, is what you were saying about results interpretation, data ingestion, and the pattern matching that comes after, or even, in a post-production environment, being able to spot those patterns more proactively.
I think AI is really bookending the process right now, but there's that middle chunk where, yeah, there are still too many hallucinations. We're still very much in the early days as an industry in how it will actually be applied to quality. I think it will take time, and it's definitely something our team is actively working on as well.
But yeah, I think it's not the magic bullet that it can sometimes be represented as. 'Cause I've seen a lot of tools where it's almost like it creates a to-do list for you, it generates the test cases.
Like, this is what you should test. But actually doing the things on the list, checking them off, and making sure they were done correctly, that's the hard part, and that's why there need to be testers and quality managers and the embedding of quality throughout an organization: that ownership needs to continue to be handled by humans.
Rosie: There are also a lot of risks that haven't really been followed through, and there's a lot of racism in AI, for example. There are a lot of ethical or potential ethical issues, and it's kind of scary, to be honest.
And testers from that perspective, I think are great. Like at the conference that we held, there were discussions about it and it's great to see like testers kind of speaking about AI and exploring AI and pointing out the problems, right?
And it's almost like the tech industry should probably pay more attention to what we're saying because we're trying to look at the problems with AI and still be positive about it, but actually explore AI with that risk factor because there are so many risks involved.
Even people trying to pretend to be other people, or cloning voices, and all of these things that are coming up as a result of AI. Even the other day, it's not necessarily testing related, but when you go to search for, let's say, an animal on Google, so many of the image results are AI-generated images.
And it's like, this is the future of our world, where we won't know what's real or not anymore, or it's just going to mess up our sense of reality. As testers, we think about these things and we question them, and yeah, it'd be nice if we collaborated a bit more with other people, or if developers came to our conferences to understand what we're talking about and what we're concerned about. 'Cause testers naturally look towards risk and try to point it out.
Eden: Yeah, AI kind of depicts or presents to you what it wants or what things should be like or what we expect things to be like rather than what things actually are.
And I think that is complicated, because the role of testing, and why it continues to be such a business-critical process and part of a company, is that things are not always that simple, right?
It's not just that you can check a box and testing is done and you get to move on. Difficult decisions sometimes have to be made around prioritizing the issues that come in: should we keep this A/B test or this campaign turned on, did this actually fulfill the requirements of the product specification, did this undergo user acceptance testing? It isn't a black and white, done or not done thing.
And I think AI tries to simplify some of those things, but the reason the human strategic element of building a quality process is so important is that things are not that black and white. You can't just hallucinate the button that you want to have on the screen, click on it, and move on.
The test did not pass, because in reality something else happened that was unexpected or different, and a human knows how to course correct, work around it, or make a call like, oh, does there need to be a hotfix?
Do we need to pause the release and go back and take the time to investigate or remediate this? So I think there is so much that still needs to be human. I hope all of these AI tools will really accelerate things and make it easier to make decisions, and exactly like you were saying before, drill into exactly what's going wrong.
So no one has to be sent on a wild goose chase for several hours to figure it out. But yeah, there's so much decision making in a quality process that I don't think AI is at that level yet.
And that's the way we see it at Mobot as well: if there are ways to streamline, make things faster, provide more relevant information, that's great. But yeah, you have to be really, really careful.
We're still kind of in the early stages of the industry, but it's definitely exciting. But I think, yeah, there needs to be a continued dialogue and I think that's why having a community like Ministry of Testing is really valuable for that.
Because there needs to be a place where we can have folks from all around the world have these important conversations with each other from different companies, different experiences.
Otherwise there's just too much siloing and people are not pointing out the patterns and the challenges, with AI, but also just broadly, across the whole industry.
So yeah, grateful that the community that you've built continues to exist.
Rosie: Thank you.
Eden: I guess one more question I have for you: we touched on how browsers and testing frameworks have evolved over the years. You've also seen, of course, as Ministry of Testing has taken off, that it used to be all about the web, but there's an increasing demand for mobile, which is a different tech stack, a slightly different form factor and skillset.
I'm curious how you've seen the conversation evolve with web versus mobile. Are testers in the community focused on both, or is there some specialization?
Like are there similarities and differences between web and mobile that you've seen in terms of the way people talk about best practices or quality processes?
Rosie: Yes, it's a good question. There is overlap, people do talk about mobile, but probably not nearly as much as web. So maybe there's a bit of a gap there to be talking more about it and the challenges involved. And the idea of mobile labs, for me, like running one yourself, just sounds intense.
I'm not sure I personally ever want the responsibility of managing so many pieces of hardware. But I guess mobile has come a bit later and perhaps needs a bit more time to mature and figure out better, more efficient ways to test on mobile, right? But yeah, I guess you probably see more than me from the mobile perspective.
Eden: Yeah, as weird as it sounds, mobile is definitely still sometimes in its early days. 'Cause if we think about web and how long web browsers, Selenium, and the DOM, the document object model, have been around, the world of web is open source, it's been around for so long, and everyone's used to it.
People are building new frameworks, whether it's Selenium or Cypress or Playwright, and new tools pop up all the time, and I think that's fantastic. But what's really different about the world of mobile is that iOS and Android are still primarily controlled by Apple and Google, and it's also a younger ecosystem.
I think the trends, the way people have used phones, smartwatches, Android Auto, Apple CarPlay, I think those are still like relatively new things and I think we're still kind of in a world where new features, new use cases are being announced all the time.
Cameras are still evolving, hardware is still evolving, and so it's still on the earlier side when you contrast it with how stable web as a platform has been. That's something we're all in on, on the Mobot side: we think it's going to get more complicated, and we're here for it, here to build the solutions and tools to support that.
But it is interesting, and I'm not surprised to hear that a lot of the dialogue that goes on at Ministry of Testing is probably more holistically focused, because of course there are testers and careers from all walks of life, all industries, and not every team has a mobile app yet.
That's something I think is going to change. As a society, we're still defining when it makes sense to build a mobile app and when it makes sense to just use a mobile browser for something.
And so consumer behavior is still being defined, and I think the role that quality is going to play with these different tech stacks is going to evolve.
So I think in the next three to five years, things are going to change pretty drastically, but I'm not surprised to hear that there's still a lot being figured out, and I think it'll be interesting to see how automation and AI play a role in that.
Rosie: Yeah, there's even, I forget what you call it off the top of my head, but definitely a trend towards people not building mobile apps, making them browser-based instead, and having the app available on your phone but just connected via a browser.
So it's almost like your mobile is essentially a mini computer or mini browser, but it has the extra challenge that you take it anywhere, and as a result of taking it anywhere, it gets complicated. Moving about with your phone is one of the issues that comes up.
Eden: Yeah.
Rosie: Yeah. So I guess there's that difference. There's mobile testing with apps and there's mobile testing that's just using your browser, and do companies want to maintain an app and their website separately? I think there's a trend for some companies to move away from that, because it becomes a lot to maintain over a period of time.
Eden: I think the more succinct way to describe it that I've seen is, if your app is just reading information, you're probably better off doing that in a browser. If it's read-only and the consumer who's interacting with the product doesn't have to input a lot of information.
There aren't a lot of location services or push notifications, you don't have to be at a specific place at a specific time to do a certain action, there's no Bluetooth peripheral involved. Absolutely, there's been a trend of, yeah, at that point you might as well just open a browser.
But where we've seen that a mobile app is necessitated is when there are transactions, interactions, more complex use cases, and at that point it becomes unavoidable.
You do have to build a mobile app, and there have been explorations with web views embedded inside native iOS and Android applications, and also with cross-platform, single-codebase mobile applications like React Native or Flutter, and those can be really beneficial.
But when you get into that world of embedded web views in native apps, it's tricky to automate with Selenium or Playwright, because now the web view is inside the native part, so you have to test the native part, and Appium and Detox don't quite cover it, and things like that.
And so there's definitely a gap where manual testing ends up happening. That's also the area that we're interested in: more realistic, real-world use cases that continue to need innovative tools developed for them.
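For context on why those embedded web views are awkward to automate, hybrid-app tools like Appium require switching between the native context and the web view context explicitly. A rough sketch using WebdriverIO with Appium follows; the capabilities, app path, and selectors are hypothetical placeholders, not any team's actual setup:

```typescript
// A rough sketch of why embedded web views complicate automation: with
// Appium (driven here through WebdriverIO) you have to switch between the
// native context and the web view context explicitly. The capabilities,
// app path, and selectors below are hypothetical placeholders.
import { remote } from 'webdriverio';

async function main() {
  const driver = await remote({
    hostname: 'localhost',
    port: 4723, // default Appium server port
    capabilities: {
      platformName: 'Android',
      'appium:automationName': 'UiAutomator2',
      'appium:app': '/path/to/hybrid-app.apk', // hypothetical app
    },
  });

  try {
    // Native layer first: open the screen that hosts the embedded web view.
    await driver.$('~open-checkout').click(); // accessibility id selector

    // Contexts look like ['NATIVE_APP', 'WEBVIEW_com.example.shop'].
    const contexts = await driver.getContexts();
    const webview = contexts.map(String).find((c) => c.startsWith('WEBVIEW'));
    if (!webview) throw new Error('Embedded web view never became available');

    // Web-style selectors only work after switching into the web view context.
    await driver.switchContext(webview);
    await driver.$('button#pay-now').click();

    // Switch back to the native context for anything outside the web view.
    await driver.switchContext('NATIVE_APP');
  } finally {
    await driver.deleteSession();
  }
}

main();
```

The switch back to NATIVE_APP at the end matters because anything rendered outside the web view is invisible to web-style selectors.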
But yeah, there's definitely a shift in how folks think about designing mobile applications and when it's necessary to have one. My personal thesis on this is that the frivolous, kind of useless read-only mobile apps, we're going to see fewer and fewer of those, but the apps where you have to have it because it's a real-world use case, those are going to become more prevalent, and I think hardware is going to become more proliferated and fragmented.
But yeah, there definitely needed to be this initial wave where everyone made a mobile app for almost everything, and then people pulled back and were like, "Oh wait, yeah, this can just be a browser thing."
But then there are other use cases that I think can't be avoided, and it's just a much more embedded and holistic user experience if it is a native experience on mobile. So that's the trend that I've seen, but it's nice to know that you've also observed the same thing from your community.
Rosie: Yeah, and I guess with trends, that's the way the world is going at the moment, they change so fast, right? So it's hard to keep up, and I think the trend at the moment is AI and people think they need to keep up. And to a certain extent I think we do.
But there's also a risk with adopting things too fast. There's a lot of cost associated with trying to be first, but then you can also win by being first. It's a really hard decision to make, whether you should adopt something or what path you should go down. And I guess that's just tech for you: it's complicated.
Eden: Yeah, for sure. I've really enjoyed this conversation, Rosie, thank you so much for taking the time to reminisce and discuss with me where the future is headed. I'm very excited for our community to continue to evolve, and I'm really looking forward to participating and being a part of Ministry of Testing as well.
And I always keep tabs on what you guys are working on. So yeah, thank you for all that you do, building this community for our industry.
Rosie: Thank you for having me.